
    Development of an open-source platform for calculating losses from earthquakes

    Risk analysis has a critical role in reducing casualties and damage caused by earthquakes. Recognition of this role has led to a rapid rise in demand for accurate, reliable and flexible risk assessment numerical tools and software. In response to this need, the Global Earthquake Model (GEM) started the development of an open-source platform called OpenQuake for calculating seismic hazard and risk at different scales. Alongside this framework, several other tools to help users create their own models and visualize their results are currently being developed, and will be made available as a Modelers Tool Kit (MTK). In this paper, a description of the architecture of OpenQuake is provided, highlighting the current data model, the workflow of the calculators and the main challenges that arise when running this type of calculation at a global scale. In addition, a case study is presented for the Marmara Region (Turkey), in which the losses for a single event are estimated, as well as the probabilistic risk over a 50-year time span.
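The single-event (scenario) loss calculation mentioned in the abstract can be sketched in a few lines. This is a minimal illustration, not OpenQuake's actual API: the function names, the piecewise-linear vulnerability function and the (value, ground motion) asset pairs are all assumptions made for the example.

```python
import bisect

def mean_loss_ratio(iml, imls, ratios):
    """Linearly interpolate the mean loss ratio for an intensity level.

    imls: sorted intensity measure levels; ratios: mean loss ratio at each."""
    if iml <= imls[0]:
        return ratios[0]
    if iml >= imls[-1]:
        return ratios[-1]
    i = bisect.bisect_left(imls, iml)
    x0, x1 = imls[i - 1], imls[i]
    y0, y1 = ratios[i - 1], ratios[i]
    return y0 + (y1 - y0) * (iml - x0) / (x1 - x0)

def scenario_loss(assets, imls, ratios):
    """Total expected loss over a portfolio for one event.

    Each asset is a (exposed_value, ground_motion_at_site) pair."""
    return sum(value * mean_loss_ratio(gm, imls, ratios)
               for value, gm in assets)
```

For instance, with a vulnerability function defined at intensities `[0.1, 0.3, 0.5]` with loss ratios `[0.0, 0.2, 0.6]`, an asset worth 1,000,000 shaken at 0.2 contributes a loss ratio of 0.1, i.e. 100,000.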

    Developing a global risk engine

    Risk analysis is a critical link in the reduction of casualties and damage due to earthquakes. Recognition of this relation has led to a rapid rise in demand for accurate, reliable and flexible risk assessment software. However, there is a significant disparity between the high-quality scientific data developed by researchers and the availability of versatile, open and user-friendly risk analysis tools to meet the demands of end users. In the past few years, several open-source software packages that play an important role in seismic research have been developed, such as OpenSHA and OpenSEES. There is, however, still a gap when it comes to open-source risk assessment tools and software. To fill this gap, the Global Earthquake Model (GEM) has been created. GEM is an internationally sanctioned program initiated by the OECD that aims to build independent, open standards to calculate and communicate earthquake risk around the world. This initiative started with a one-year pilot project named GEM1, during which a number of existing risk software packages were evaluated. After a critical review of the results, it was concluded that none of them met GEM's requirements and that, therefore, a new object-oriented tool was to be developed. This paper presents a summary of some of the best-known applications used in risk analysis, highlighting the main aspects that were considered in the development of this risk platform. The research carried out to gather the information needed to build this tool covered four areas: the information technology approach, seismic hazard resources, vulnerability assessment methodologies and sources of exposure data. The main aspects and findings for each of these areas are presented, along with how these features were incorporated into the current risk engine.
    Currently, the risk engine is capable of predicting human or economic losses worldwide for both deterministic and probabilistic events, using vulnerability curves. A first version of GEM will become available at the end of 2013. Until then, the risk engine will continue to be developed by a growing community of developers on a dedicated open-source platform.
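As a rough sketch of the probabilistic side, an event-loss table can be turned into loss exceedance probabilities for a given time span under a Poisson occurrence assumption. The function name and input layout below are illustrative, not the risk engine's actual interface.

```python
import math

def loss_exceedance_probs(event_losses, time_span):
    """Turn an event-loss table into a loss exceedance curve.

    event_losses: (loss, annual_rate) pairs for stochastic events.
    Returns (loss, probability of exceedance within time_span) pairs,
    sorted by decreasing loss. Assumes Poissonian event occurrence:
    P(at least one exceedance) = 1 - exp(-cumulative_rate * t)."""
    pairs = sorted(event_losses, key=lambda p: p[0], reverse=True)
    curve, cum_rate = [], 0.0
    for loss, rate in pairs:
        cum_rate += rate  # rate of any event causing a loss >= this one
        curve.append((loss, 1.0 - math.exp(-cum_rate * time_span)))
    return curve
```

With two events of annual rates 0.002 and 0.01 and a 50-year span, the larger loss has exceedance probability 1 - exp(-0.1) ≈ 9.5%, the smaller 1 - exp(-0.6) ≈ 45%.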

    Evaluation of analytical methodologies to derive vulnerability functions

    The recognition of fragility functions as a fundamental tool in seismic risk assessment has led to the development of increasingly complex and elaborate procedures for their computation. Although vulnerability functions have traditionally been produced using observed damage and loss data, more recent studies propose analytical methodologies as a way to overcome the frequent lack of post-earthquake data. The effect of the structural modelling approach on the estimation of building capacity has been the target of many studies in the past; however, its influence on the resulting vulnerability model, its impact on loss estimates and the propagation of the uncertainty to the seismic risk calculations have so far received limited scrutiny. Hence, in this paper, an extensive study of static and dynamic procedures for estimating the nonlinear response of buildings has been carried out in order to evaluate the impact of the chosen methodology on the resulting vulnerability and risk outputs. Moreover, the computational effort and numerical stability of each approach were evaluated, and conclusions were drawn regarding which one offers the optimal balance between accuracy and complexity.

    Extending displacement-based earthquake loss assessment (DBELA) for the computation of fragility curves

    This paper presents a new procedure to derive fragility functions for populations of buildings that relies on the displacement-based earthquake loss assessment (DBELA) methodology. In the method proposed herein, thousands of synthetic buildings are produced from the probabilistic distributions describing the variability in geometrical and material properties. Their nonlinear capacity is then estimated using the DBELA method, and their response to a large set of ground motion records is computed. Global limit states are used to estimate the distribution of buildings in each damage state at different levels of ground motion, and a regression algorithm is applied to derive fragility functions for each limit state. The proposed methodology is demonstrated for the case of ductile and non-ductile Turkish reinforced concrete frames with masonry infills.
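The regression step that turns observed exceedance fractions into a fragility function can be illustrated as follows. This is a simplified stand-in for the paper's algorithm: it fits a lognormal fragility curve by ordinary linear regression of probits on log intensity, whereas the actual study may use a different fitting scheme; all names here are hypothetical.

```python
import math
from statistics import NormalDist

def fit_lognormal_fragility(imls, fractions):
    """Fit P(exceed limit state | IM) = Phi((ln im - ln theta) / beta).

    imls: intensity measure levels; fractions: fraction of buildings
    exceeding the limit state at each level. Regresses probits on log
    intensity (maximum likelihood is often preferred in practice).
    Returns (theta, beta): the median and lognormal standard deviation."""
    nd = NormalDist()
    xs = [math.log(im) for im in imls]
    # clip fractions away from 0 and 1 so the probit stays finite
    ys = [nd.inv_cdf(min(max(p, 1e-6), 1 - 1e-6)) for p in fractions]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    beta = 1.0 / slope            # probit model: y = (ln x - ln theta) / beta
    theta = math.exp(-intercept * beta)
    return theta, beta
```

Fitting data generated from a known curve (median 0.3, beta 0.4) recovers those parameters, which is a convenient sanity check for any implementation.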

    Parameter identification in continuum models

    Approximation techniques for use in numerical schemes for estimating spatially varying coefficients in continuum models, such as those for Euler-Bernoulli beams, are discussed. The techniques are based on quintic spline state approximations and cubic spline parameter approximations. Both theoretical and numerical results are presented.
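As a much-simplified illustration of the idea of representing an unknown spatially varying coefficient in a finite-dimensional basis, the sketch below uses piecewise-linear "hat" functions in place of the cubic splines discussed in the abstract; the function names are invented for the example.

```python
def hat_basis(x, nodes, j):
    """Piecewise-linear 'hat' basis function centered at nodes[j]
    (a lower-order stand-in for the cubic spline basis in the paper)."""
    left = nodes[j - 1] if j > 0 else nodes[j]
    right = nodes[j + 1] if j < len(nodes) - 1 else nodes[j]
    if x < left or x > right:
        return 0.0
    if x <= nodes[j]:
        return 1.0 if nodes[j] == left else (x - left) / (nodes[j] - left)
    return 1.0 if right == nodes[j] else (right - x) / (right - nodes[j])

def approximate(coeffs, nodes, x):
    """Finite-dimensional approximation EI(x) ~ sum_j c_j * B_j(x).

    The estimation problem then reduces to finding the finitely many
    coefficients c_j rather than a whole unknown function."""
    return sum(c * hat_basis(x, nodes, j) for j, c in enumerate(coeffs))
```

Because each hat function equals one at its own node and zero at the others, the coefficients coincide with the approximated parameter values at the nodes, which makes the parameterization easy to interpret.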

    Run-time parallelization and scheduling of loops

    Run-time methods are studied to automatically parallelize and schedule iterations of a do loop in certain cases where compile-time information is inadequate. The methods presented involve execution-time preprocessing of the loop. At compile time, these methods set up the framework for performing a loop dependency analysis. At run time, wavefronts of concurrently executable loop iterations are identified, and using this wavefront information, loop iterations are reordered for increased parallelism. Symbolic transformation rules are used to produce inspector procedures, which perform the execution-time preprocessing, and executors, transformed versions of the source loop structures that carry out the calculations planned in the inspector procedures. Performance results are presented from experiments conducted on the Encore Multimax. These results illustrate that run-time reordering of loop indices can have a significant impact on performance. Furthermore, the overheads associated with this type of reordering are amortized when the loop is executed several times with the same dependency structure.
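The inspector phase described above can be sketched for a simple loop with indirect array accesses, such as `a[x[i]] = f(a[y[i]])`. This is an illustrative reconstruction, not the paper's implementation: it conservatively treats any two iterations touching the same array element as dependent (including read-after-read pairs).

```python
def inspector(indices):
    """Inspector phase: derive wavefronts for a loop whose dependencies
    are only known at run time.

    indices holds one (write_idx, read_idx) pair per iteration. Each
    iteration is placed in the earliest wavefront that follows every
    earlier iteration touching the same array elements."""
    last_wave = {}    # array element -> wavefront of its last access
    wavefronts = []
    for it, (w, r) in enumerate(indices):
        wave = max(last_wave.get(w, -1), last_wave.get(r, -1)) + 1
        if wave == len(wavefronts):
            wavefronts.append([])
        wavefronts[wave].append(it)
        last_wave[w] = last_wave[r] = wave
    return wavefronts
```

An executor would then run the loop wavefront by wavefront, with all iterations inside one wavefront free to execute in parallel; reusing the schedule across repeated executions of the loop is what amortizes the inspector's cost, as the abstract notes.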

    The Marinhieros Project: Roseneath Rd & Patterson Ave

    In questioning the very nature of a thing, at its most basic level, a new assessment can be made of what the thing in question truly is. When we ask ourselves what a weed is, we begin to pull the word apart - to decrypt the word from the cultural baggage that has collected around it over the course of the history of language. The cultural connotations of 'weed' cling to it like barnacles, removing the word from its true value. We reevaluate meaning, chronicling all the possible constructions of a word, all the possible varieties, where it came from, what its uses are, and so on. We can then begin to develop an aggregate meaning based on an inherently more textured meaning, nuanced and built to sustain an elaboration of new information within the word itself. Weeds may serve as a successful metaphor for humanity's quest for value, but this should not be assumed - we must first plot a course before we set sail.

    Property: Condominium: What Place--Space


    Methods for the identification of material parameters in distributed models for flexible structures

    Theoretical and numerical results are presented for inverse problems involving the estimation of spatially varying parameters, such as stiffness and damping, in distributed models for elastic structures such as Euler-Bernoulli beams. An outline of the algorithms used and a summary of computational experiences are presented.